Concepedia

Concept: affective computing

Publications: 46K
Citations: 3.5M
Authors: 92.2K
Institutions: 9.3K
Multimodal Affective Interfaces

1989–1995

During 1989–1995, the field coalesced around integrating the sensing, interpretation, and generation of emotion in human-computer interaction. Research emphasized multimodal emotion recognition (facial expressions, physiological cues) and affective interfaces, alongside psychometric measures of user states to guide interaction design. The period established standard stimuli and scalable annotation approaches that enabled early emotion-aware computing.

Emotion recognition and affective interface development unify sensing, interpretation, and generation of emotion in HCI, spanning facial-expression recognition, multimodal cues, and emotion-driven animation [2], [3], [4], [6], [10], [11], [12], [14].

Affective state measurement and attitudes in computing converge on psychometric scales of stress, hassles, anxiety, and trust, illustrating how user affect governs technology interaction [1], [7], [9], [15], [18], [19].

Computational analysis and representation of facial expressions and smiles underpin recognition and animation research, including spatio-temporal modeling, smile quantification, and multimodal facial systems [2], [3], [5], [8], [10], [12].

Language and semantic framing of emotions shape theory and application in HCI, connecting affective semantics to emotion recognition, regulation, and social cognition [9], [13], [14], [16], [17].

Multimodal Affective Computing

1996–2013

In-the-Wild Multimodal Affective Computing

2014–2020

Self-Supervised Multimodal Fusion

2021–2023